Probability Theory

Notation

Probability mass function

$$ \begin{aligned} p_{X}(x) &= P[X = x] \end{aligned} $$ $$ \begin{aligned} p_{X, Y}(x, y) &= P[X = x, Y = y] \\ \begin{cases} p_{X}(x) = \sum_{y \in \mathcal{Y}} p_{X, Y}(x, y) \\ p_{Y}(y) = \sum_{x \in \mathcal{X}} p_{X, Y}(x, y) \end{cases} \end{aligned} $$
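The marginalization above can be sketched in NumPy; the 2×2 joint pmf here is a hypothetical example, not from the text:

```python
import numpy as np

# Hypothetical joint pmf p_{X,Y}(x, y): rows index x, columns index y.
# Entries are nonnegative and sum to 1.
p_xy = np.array([
    [0.10, 0.20],
    [0.30, 0.40],
])

# Marginals: sum the joint pmf over the other variable.
p_x = p_xy.sum(axis=1)  # p_X(x) = sum_y p_{X,Y}(x, y)
p_y = p_xy.sum(axis=0)  # p_Y(y) = sum_x p_{X,Y}(x, y)
```

Each marginal is itself a valid pmf, so both `p_x` and `p_y` sum to 1.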

Independence

$$ \begin{aligned} p_{X, Y}(x, y) &= p_{X}(x) \cdot p_{Y}(y) \end{aligned} $$ $$ \begin{aligned} p_{X_1, \ldots, X_n}(x_1, \ldots, x_n) &= \prod_{i=1}^{n} p_{X_i}(x_i) \end{aligned} $$
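A small sketch of the factorization, using assumed marginals: building a joint pmf as the outer product of its marginals produces an independent pair by construction, and the factorization can then be verified numerically.

```python
import numpy as np

# Hypothetical marginals (assumed values, for illustration only).
p_x = np.array([0.3, 0.7])
p_y = np.array([0.4, 0.6])

# Under independence the joint pmf factorizes:
# p_{X,Y}(x, y) = p_X(x) * p_Y(y), i.e. an outer product.
p_xy = np.outer(p_x, p_y)

# Check: the joint equals the outer product of its own marginals.
independent = np.allclose(
    p_xy, np.outer(p_xy.sum(axis=1), p_xy.sum(axis=0))
)
```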

Expectation

$$ \begin{aligned} E[X] &= \sum_{x \in \mathcal{X}} x \cdot p_{X}(x) \end{aligned} $$

By the strong law of large numbers, for i.i.d. $X_i$ with finite mean, the sample average converges almost surely:

$$ \begin{aligned} \frac{1}{n} \sum_{i=1}^{n} X_i \xrightarrow{\text{a.s.}} E[X] \quad (n \rightarrow \infty) \end{aligned} $$
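A quick simulation of the law of large numbers, using a fair die as an assumed example distribution (not from the text): the sample mean of many rolls approaches $E[X] = 3.5$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fair six-sided die: E[X] = sum_x x * p_X(x) = 3.5.
faces = np.arange(1, 7)
pmf = np.full(6, 1 / 6)
exact_mean = float(np.sum(faces * pmf))

# Sample average over many i.i.d. rolls.
rolls = rng.integers(1, 7, size=100_000)
sample_mean = rolls.mean()
```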

Variance

$$ \begin{aligned} Var(X) &= E \left[ (X - E[X])^2 \right] \end{aligned} $$

By the central limit theorem, for i.i.d. $X_i$ with mean $\mu = E[X]$ and finite variance, the centered, scaled sum converges in distribution:

$$ \begin{aligned} \frac{1}{\sqrt{n}} \sum_{i=1}^{n} (X_i - \mu) \xrightarrow{d} \mathcal{N}(0, Var(X)) \end{aligned} $$
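A simulation sketch of the central limit theorem, again using a fair die as an assumed example (here $\mu = 3.5$ and $Var(X) = 35/12 \approx 2.92$): across many trials, the statistic $\frac{1}{\sqrt{n}} \sum_i (X_i - \mu)$ has mean near 0 and variance near $Var(X)$.

```python
import numpy as np

rng = np.random.default_rng(0)

mu = 3.5          # E[X] for a fair die
var = 35 / 12     # Var(X) for a fair die

n, trials = 1_000, 5_000
rolls = rng.integers(1, 7, size=(trials, n))

# One CLT statistic per trial: (1/sqrt(n)) * sum_i (X_i - mu).
z = (rolls - mu).sum(axis=1) / np.sqrt(n)
```

The empirical distribution of `z` approximates $\mathcal{N}(0, Var(X))$; a histogram of `z` makes the bell shape visible.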

by Jon